Fresh Findings
As programmable systems become increasingly complex, a rich ecosystem of technology is growing up to support the diversity of new designs that take advantage of the flexibility and time-to-market advantages afforded by today's FPGA platforms. Every time the market presents a new opportunity, another startup or established player steps forward with a solution that advances the state of the art. Let's examine a few of the more recent ones in detail, starting at the beginning of the design process and walking through to working hardware.

Design Mêlée Management
Before you can push your design down to a development board, you need to have your initial implementation and architecture in place. Whether you're assembling IP from a variety of sources, cranking out VHDL or Verilog code, or moving from Matlab or another high-level language down to hardware, you'll need to capture and verify the functional behavior of the major components of your design. As the back end of FPGA design (logic and physical synthesis, place-and-route, timing closure) becomes more automated and trouble-free, the design schedule bottleneck is squeezed forward to the beginning of the process.

Portland, Oregon-based Stelar Tools recently announced its solution for design teams developing and integrating HDL modules into a coherent design. Its HDL Explorer is designed to help design teams with what Stelar calls RTL closure. "RTL closure is the process of getting your design clean before synthesis," says Steve Sapiro, Vice President of Marketing at Stelar. "By analyzing and cleaning your design, you shorten your development time and reduce both functional errors and costs."

Stelar contends that over 80% of designs are re-works of previous projects. With a typical large design containing thousands of modules, and with teams often geographically scattered and experiencing turnover, the task of understanding and checking the validity of legacy and external IP is huge. Stelar's HDL Explorer is an intelligent graphical tool for visualizing, navigating, checking, and integrating all those modules, significantly simplifying the job of understanding and unraveling the chaos. [more]
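To make "getting your design clean before synthesis" concrete, here is a minimal, generic Verilog sketch of the kind of issue an RTL analysis pass typically flags; the module and its cleanup are purely illustrative and are not drawn from Stelar's tool.

    // Problem version: 'grant' is not assigned on every path through the
    // conditional, so synthesis infers an unintended level-sensitive latch.
    module arbiter_bad (
        input  wire       req_a,
        input  wire       req_b,
        output reg  [1:0] grant
    );
        always @* begin
            if (req_a)
                grant = 2'b01;
            else if (req_b)
                grant = 2'b10;
            // missing final else -- latch inferred for 'grant'
        end
    endmodule

    // Cleaned version: a default assignment covers every path, so the block
    // synthesizes to the purely combinational logic the designer intended.
    module arbiter_clean (
        input  wire       req_a,
        input  wire       req_b,
        output reg  [1:0] grant
    );
        always @* begin
            grant = 2'b00;   // default keeps the logic combinational
            if (req_a)
                grant = 2'b01;
            else if (req_b)
                grant = 2'b10;
        end
    endmodule

Catching this class of mistake while the RTL is still being assembled, rather than after a failed synthesis or gate-level simulation run, is the heart of the RTL-closure argument.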
Proponents of the Field Programmable Gate Array have fought for years to overcome the "stepping stone" mentality with which the traditionalist engineering community has viewed the FPGA. Used primarily as either an ASIC prototyping platform or as a time-to-market stopgap until the company can produce a processor-based or ASIC-based system, the FPGA has only begun to prove its worth as an end-product solution. To some extent, the problem has been choosing the right battleground. The FPGA has a very specific set of value propositions that, taken together, allow it to easily supplant ASICs and processors in the right application. Fundamentally, the FPGA offers fast time to market, low design and manufacturing cost and risk, extremely high processing performance (especially in massively parallel processing applications), and, of course, configurability. Conveniently, these value propositions align closely with the requirements posed by a large and growing number of advanced imaging applications. Moreover, the growth of "FPGA Computing" and the availability of off-the-shelf products designed for this market make it easier for developers to put the technology to use.

Image processing – the right battlefield!
Image processing applications have traditionally pushed the data-processing envelope, both in the amount of data being processed and in algorithmic complexity. Advances in image capture technology have recently intensified this demand: while frame rates and resolutions climb, the cost of the technology is falling – a potent combination that results in the need to process more pixels in less time. [more]
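As a rough illustration of the parallelism point, the fragment below sketches a streaming 3-tap pixel filter in Verilog that consumes one pixel per clock, with every tap combined in the same cycle. The interface and module name are assumptions made for this example, not part of any product mentioned here.

    // Minimal sketch: a 3-tap horizontal sum over an 8-bit pixel stream.
    // One new pixel enters per clock and all taps are combined in parallel --
    // the kind of fixed, replicated datapath that maps naturally onto an FPGA.
    module pixel_filter_3tap (
        input  wire       clk,
        input  wire       valid_in,
        input  wire [7:0] pixel_in,
        output reg        valid_out,
        output reg  [9:0] pixel_out   // wide enough to hold the 3-pixel sum
    );
        reg [7:0] p1, p2;             // two-deep delay line of prior pixels

        always @(posedge clk) begin
            if (valid_in) begin
                p1 <= pixel_in;                  // shift the window by one pixel
                p2 <= p1;
                pixel_out <= pixel_in + p1 + p2; // all three taps summed this cycle
            end
            valid_out <= valid_in;               // result follows one clock later
        end
    endmodule

Because the datapath is fixed in hardware, keeping up with higher frame rates or wider filter windows is largely a matter of replicating blocks like this one, which is where the massively parallel advantage over a sequential processor shows up.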
All material copyright © 2003-2005 techfocus media, inc. All rights reserved.